18 research outputs found

    Evasive Internet: Reducing Internet Vulnerability through Transient Destination

    In the current Internet architecture, traffic is routed to its destination using DNS names that are mapped to IP addresses, yet there are no inherent means for receivers to attribute traffic to its senders or to authorize senders. These deficiencies leave the Internet and its connected hosts vulnerable to a wide range of attacks, including denial-of-service and misrepresentation (spoofing, phishing, etc.), which continue to cause material damage. This work proposes a mechanism to combat these vulnerabilities by introducing attribution and authorization into the network: a transient addressing scheme establishes attribution through DNS, establishes authorization at the host, and enforces both in the network. I developed and characterized a system for effecting in-network enforcement at the router, and I demonstrate that enforcement is possible on current commodity hardware at sustained throughput rates well above common Internet connection rates. The current Internet architecture allows hosts to send arbitrary IP packets whose source address information may not be valid, so IP spoofing and denial-of-service attacks are ubiquitous, and filtering techniques are not sufficient to counter them. The proposed design calls for in-network authentication of addresses and attribution of the traffic they generate; in this architecture the destination can only be reached through a valid capability. The aim of this dissertation is to implement the Evasive Internet Protocol for end hosts and measure its preliminary performance against current Internet protocols.
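    The capability-based enforcement described above can be illustrated with a toy sketch. The key name, field layout, and lifetime are assumptions for illustration, not the dissertation's actual protocol: the destination issues a short-lived capability (an HMAC over source, destination, and expiry), and a router forwards only packets carrying a valid, unexpired capability.

    ```python
    import hashlib
    import hmac
    import time

    SECRET = b"router-local-key"  # hypothetical shared enforcement key

    def issue_capability(src, dst, lifetime=60):
        """Destination-authorized capability: HMAC over (src, dst, expiry)."""
        expiry = int(time.time()) + lifetime
        tag = hmac.new(SECRET, f"{src}|{dst}|{expiry}".encode(), hashlib.sha256).digest()
        return expiry, tag

    def forward(packet):
        """In-network enforcement: drop packets lacking a valid, unexpired capability."""
        expiry, tag = packet["capability"]
        if expiry < time.time():
            return "DROP"  # the transient destination has expired
        expected = hmac.new(SECRET, f"{packet['src']}|{packet['dst']}|{expiry}".encode(),
                            hashlib.sha256).digest()
        return "FORWARD" if hmac.compare_digest(tag, expected) else "DROP"
    ```

    A spoofed source address changes the MAC input, so the forged packet fails verification and is dropped at the router rather than at the victim.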

    Classifying Dominant Congested Path Using Correlation Factors

    Traffic classification has wide applications in network management, from security monitoring to quality-of-service measurement. Recent research tends to apply machine learning techniques to classification methods based on flow statistical features. The nearest neighbor (NN)-based method has exhibited superior classification performance. It also has several important advantages: it requires no training procedure, carries no risk of overfitting parameters, and naturally handles a huge number of classes. However, the performance of the NN classifier can be severely affected if the training set is small. In this paper, we propose a novel nonparametric approach for traffic classification that improves classification performance by incorporating correlated information into the classification process. We analyze the new classification approach and its performance benefit from both theoretical and empirical perspectives. A large number of experiments are carried out on two real-world traffic data sets to validate the proposed approach. The results show that classification performance can be improved significantly, even in the extremely difficult circumstance of very few training samples.
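    One simple way to use correlated information, sketched below under the assumption that correlated flows share the same application class (the feature values and class labels are invented for illustration): classify each flow with 1-NN on its statistical features, then take a majority vote over a bag of correlated flows.

    ```python
    import math

    def nn_predict(flow, train):
        """1-NN: return the class of the training flow nearest in feature space."""
        return min(train, key=lambda t: math.dist(t[0], flow))[1]

    def classify_correlated(bag, train):
        """Classify a bag of correlated flows by majority vote over per-flow 1-NN."""
        votes = [nn_predict(f, train) for f in bag]
        return max(set(votes), key=votes.count)
    ```

    With tiny training sets, a single flow's nearest neighbor is noisy; aggregating votes across correlated flows smooths out individual misclassifications.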

    Efficient and Secure Tracking-Based Text Detection and Recognition from Web Videos

    Sensor nodes forming a network and using wireless communications are highly useful in a variety of applications, including battlefield (military) surveillance, building security, medical and health services, environmental monitoring in harsh conditions, scientific investigations on other planets, etc. But these wireless sensors are resource-constrained: limited power supply, communication bandwidth, processing speed, and memory. One possible way of achieving maximum utilization of these constrained resources is to apply signal processing and compress the sensor readings. Processing data usually consumes much less power than transmitting it over a wireless medium, so it is effective to trade computation for communication, compressing data before transmission to reduce a node's total power consumption. However, existing state-of-the-art compression algorithms are not suitable for wireless sensor nodes because of these limited resources, so signal processing (compression) algorithms must be designed with the sensors' constraints in mind. In our work, we designed a lightweight codec system aimed at surveillance as the target application. In designing the codec system, we proposed new design ideas and tweaked existing encoding algorithms to fit the target application. Data must also be secured during transmission among sensors and between sensors and the base station; we address some of these security issues by assessing the security of wavelet tree shuffling as the only security mechanism.
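    The compute-for-communication trade-off can be illustrated with the simplest lightweight scheme of this kind, delta encoding (this is a generic example, not the codec the work actually designed): slowly varying sensor readings are sent as one base value plus small differences, which pack into far fewer bits.

    ```python
    def delta_encode(samples):
        """Transmit the first sample plus successive differences."""
        return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

    def delta_decode(deltas):
        """Rebuild the original readings by cumulative summation."""
        vals = [deltas[0]]
        for d in deltas[1:]:
            vals.append(vals[-1] + d)
        return vals
    ```

    A temperature trace like [20, 21, 21, 23] becomes [20, 1, 0, 2]: the deltas fit in a couple of bits each, so the radio, the dominant power consumer, stays off longer.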

    Data Mining in Personalized Web Searching Data's

    The World Wide Web (WWW) is a very popular and commonly used Internet information-retrieval service, and web search is now one of the most common tasks on the Internet. Users get a variety of related information for their queries. To provide more relevant and effective results, personalization techniques are used: personalized web search tailors results specifically to a person's interests by incorporating information about the query provided. Two general approaches to personalizing search results are modifying the user's query and re-ranking search results. Several personalized web search techniques are based on web content, web link structure, browsing history, user profiles, and user queries. This paper presents a survey of these personalization techniques.
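    The re-ranking approach mentioned above can be sketched minimally (the topic-vector representation and field names are assumptions for illustration): score each result by cosine similarity between its topic vector and the user's interest profile, then sort.

    ```python
    def cosine(u, v):
        """Cosine similarity between two equal-length vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = sum(a * a for a in u) ** 0.5
        nv = sum(b * b for b in v) ** 0.5
        return dot / (nu * nv) if nu and nv else 0.0

    def rerank(results, profile):
        """Reorder search results by similarity to the user's interest profile."""
        return sorted(results, key=lambda r: cosine(r["topics"], profile), reverse=True)
    ```

    The original ranking from the search engine is preserved as a tie-breaker by Python's stable sort, so personalization only reorders results the profile actually distinguishes.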

    Collaborative Tagging and Taxonomy by Vector Space Approach

    Collaborative tagging, or group tagging, is tagging performed by a group of users, usually to support re-finding items. The flexibility of tagging allows users to classify their collections of items in ways they find useful, but the personalized variety of expressions can present challenges when searching and browsing. When users can freely choose tags (creating and applying public tags to online items rather than selecting terms from a controlled vocabulary), the resulting metadata can contain homonyms (the same tag used with different meanings) and synonyms (multiple tags for the same concept), which may lead to inappropriate connections between items and inefficient searches for information about a subject. Collaborative tagging therefore requires mechanisms that enable users to protect their privacy by hiding certain user-generated content without making it useless for the purposes for which it was provided in a given online service; privacy-preserving mechanisms must not harm the service's accuracy and usefulness. The proposed approach protects user privacy to a certain level by reducing the tags that make a user profile reveal a bias toward certain categories of interest or feedback.
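    A minimal sketch of the two ingredients named in the title and abstract, under assumed data shapes (flat tag lists and a fixed vocabulary, both invented for illustration): a vector space representation of tagged items, and a suppression step that hides tags from sensitive categories before the profile is published.

    ```python
    def tag_vector(tags, vocab):
        """Vector space model: represent a user or item as tag frequencies."""
        return [tags.count(t) for t in vocab]

    def suppress(tags, sensitive):
        """Privacy step: hide tags that would reveal bias toward sensitive categories."""
        return [t for t in tags if t not in sensitive]
    ```

    Because only the sensitive coordinates are zeroed out, similarity computations over the remaining tags (for search and browsing) still work, which is the utility-preservation requirement the abstract states.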

    Web Prediction Mechanism for User Personalized Search

    Personalized web search (PWS) has demonstrated its effectiveness in improving the quality of various search services on the Internet. However, evidence shows that users' reluctance to disclose their private information during search has become a major barrier to the wide proliferation of PWS. We study privacy protection in PWS applications that model user preferences as hierarchical user profiles. We propose a PWS framework called UPS (User customizable Privacy-preserving Search) that can adaptively generalize profiles by queries while respecting user-specified privacy requirements. Our runtime generalization aims at striking a balance between two predictive metrics that assess the utility of personalization and the privacy risk of exposing the generalized profile. We present two greedy algorithms, namely GreedyDP and GreedyIL, for runtime generalization. We also provide an online prediction mechanism for deciding whether personalizing a query is beneficial. Extensive experiments demonstrate the effectiveness of our framework. The experimental results also reveal that GreedyIL significantly outperforms GreedyDP in terms of efficiency.
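    The utility/privacy trade-off behind runtime generalization can be caricatured as follows. This is not GreedyDP or GreedyIL themselves (whose metrics are defined in the paper); it is a generic greedy sketch with invented weights: repeatedly drop the highest-risk topic from a flat profile as long as the remaining profile keeps enough personalization utility.

    ```python
    def greedy_generalize(profile, risk, min_utility):
        """Greedily remove the riskiest topics while total utility stays above
        min_utility. profile: topic -> utility weight; risk: topic -> risk score."""
        profile = dict(profile)
        for topic in sorted(profile, key=lambda t: risk[t], reverse=True):
            if sum(profile.values()) - profile[topic] >= min_utility:
                del profile[topic]
        return profile
    ```

    The real algorithms operate on a topic hierarchy and prune toward more general ancestors rather than deleting outright, but the stopping condition, generalize until the predicted privacy risk is acceptable without destroying utility, has the same shape.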

    Enabling Public Auditability and Data Dynamics for Storage Security in Data Mining

    Data mining has been envisioned as the next-generation architecture of the IT enterprise. It moves application software and databases to large centralized data centers, where the management of the data and services may not be fully trustworthy. This unique paradigm brings many new security challenges which have not been well understood. This work studies the problem of ensuring the integrity of data storage. In particular, we consider the task of allowing a third-party auditor (TPA), on behalf of the cloud client, to verify the integrity of the dynamic data stored in the cloud. The introduction of the TPA eliminates the client's involvement in auditing whether the data stored in the cloud are indeed intact, which can be important in achieving economies of scale. Support for data dynamics via the most general forms of data operation, such as block modification, insertion, and deletion, is also a significant step toward practicality, since services are not limited to archive or backup data only.
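    The auditing idea can be sketched in its simplest form, a toy spot check rather than the homomorphic-authenticator schemes such systems actually use: the client fingerprints each block before outsourcing, and the TPA later challenges the storage provider on a chosen block index and verifies the fingerprint without involving the client.

    ```python
    import hashlib

    def block_hashes(blocks):
        """Client-side: fingerprint each storage block before outsourcing."""
        return [hashlib.sha256(b).hexdigest() for b in blocks]

    def audit(stored_blocks, hashes, i):
        """TPA-side spot check: challenge block i and verify its fingerprint."""
        return hashlib.sha256(stored_blocks[i]).hexdigest() == hashes[i]
    ```

    Supporting data dynamics is what makes the real problem hard: after an insertion or deletion, every stored index shifts, which is why published schemes organize the fingerprints in an authenticated structure such as a Merkle hash tree instead of a flat list.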

    Automated Network Diagnosis to Prevent Problems

    Software that performs well in one environment may be unusably slow in another, and determining the root cause is time-consuming and error-prone, even in environments where all the data is available. End users have an even harder time diagnosing system performance, since both software and network problems may be responsible. Diagnosing performance degradation in distributed systems is a complex and difficult task. The source of performance stalls in a distributed system can be automatically detected and diagnosed with very limited information: the dependency graph of data flows through the system, and a few counters common to almost all data-processing systems. We present Flow Diagnoser, an automated approach for diagnosing performance stalls in networked systems. Flow Diagnoser requires as little as two bits of information per module to make a diagnosis: one to indicate whether the module is actively processing data, and one to indicate whether the module is waiting on its dependents. Flow Diagnoser is implemented in two distinct environments: an individual host's networking stack and a distributed stream-processing system.
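    The two-bit idea can be sketched as a walk over the dependency graph (the data layout and single-dependent assumption here are simplifications for illustration, not Flow Diagnoser's actual implementation): a module that is stalled but waiting on a dependent is not to blame, so the diagnosis descends until it reaches a module that is idle without waiting on anyone.

    ```python
    def diagnose(start, status, deps):
        """Walk from the stalled module using two bits per module:
        'active' (processing data) and 'waiting' (blocked on its dependent).
        Blame the first module that is neither active nor waiting."""
        module = start
        while not status[module]["active"] and status[module]["waiting"]:
            module = deps[module]  # follow the edge to the dependent it waits on
        return module
    ```

    For example, if an application is stalled waiting on TCP, and TCP is stalled waiting on the NIC, but the NIC is idle and waiting on nothing, the NIC is diagnosed as the source of the stall.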

    PACCE - Password-Authenticated and Confidential Channel Establishment Protocols

    Secure protocols for password-based user authentication are well studied in the cryptographic literature but have not seen widespread adoption on the Internet; most proposals to date require full modifications to the Transport Layer Security (TLS) protocol, making deployment difficult. Recently, several designs have been proposed in which a cryptographically secure password-based mutual authentication protocol is run inside a confidential (but not necessarily authenticated) channel such as TLS; the password protocol is bound to the established channel to prevent active attacks. Such protocols are useful in practice for a variety of reasons: for example, they need not rely on users' ability to validate server certificates, and they can likely be implemented with no modifications to the secure channel protocol library. This work offers a systematic study of such authentication protocols. Building on recent advances in modelling TLS, we give a formal definition of the intended security goal, which we call password-authenticated and confidential channel establishment (PACCE). We show generically that combining a secure channel protocol, such as TLS, with a password authentication protocol bound to the channel yields a secure PACCE protocol. Our prototypes based on TLS are available as a cross-platform client-side Firefox browser extension as well as an Android application and a server-side web application that can easily be installed on servers.
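    The channel-binding step can be illustrated with a toy sketch. The raw SHA-256 "KDF" and the byte strings are stand-ins for illustration; real designs run a proper PAKE or a hardened password hash over an actual TLS channel-binding value such as tls-unique. The point is only that the authentication tag depends on both the password and this specific channel.

    ```python
    import hashlib
    import hmac

    def bind_auth(password, channel_binding):
        """MAC the channel-binding value under a password-derived key, tying
        the password authentication to this particular secure channel."""
        key = hashlib.sha256(password.encode()).digest()  # toy KDF for illustration
        return hmac.new(key, channel_binding, hashlib.sha256).hexdigest()

    def verify(password, channel_binding, tag):
        return hmac.compare_digest(bind_auth(password, channel_binding), tag)
    ```

    A man-in-the-middle terminates two different TLS channels, so its channel-binding value differs from the client's and any relayed tag fails verification, which is exactly the active attack the binding prevents.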

    Effectiveness of Social Media Community Using Optimized Clustering Algorithm

    Nowadays social media is used to introduce new issues and host discussions about them, and many users participate in these discussions. Different users belong to different kinds of groups, and users post positive and negative comments as they participate. We propose a system to group different kinds of users and specify which category they belong to, for example film industry, politicians, etc. Once social media data such as user messages are parsed and network relationships are identified, data-mining techniques can be applied to group different types of communities. We use the K-Means clustering algorithm to cluster the data; the system detects communities by clustering messages from large streams of social data. Our approach gives better clustering results and provides a novel use case of grouping user communities based on their activities. The application identifies the group of people who viewed and commented on a post, which helps to categorize the users.
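    A minimal K-Means over user activity vectors shows the clustering step described above; the two-dimensional feature vectors are invented for illustration (in practice they would be derived from parsed messages, views, and comments).

    ```python
    import math
    import random

    def kmeans(points, k, iters=20, seed=0):
        """Plain K-Means: group users by their activity feature vectors."""
        rng = random.Random(seed)
        centers = rng.sample(points, k)
        for _ in range(iters):
            # Assign each point to its nearest center.
            clusters = [[] for _ in range(k)]
            for p in points:
                i = min(range(k), key=lambda c: math.dist(p, centers[c]))
                clusters[i].append(p)
            # Recompute each center as the mean of its cluster.
            centers = [tuple(sum(d) / len(c) for d in zip(*c)) if c else centers[i]
                       for i, c in enumerate(clusters)]
        return centers, clusters
    ```

    Users whose activity vectors are close end up in the same cluster, which is then labeled as a community (film industry, politics, etc.) by inspecting its members.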